Interlaced video

Interlaced video is a technique for doubling the perceived frame rate of a video display without consuming extra bandwidth; it was introduced with the composite video signal used in analog television. Because the interlaced signal contains the two fields of a video frame captured at two different times, it enhances the viewer's perception of motion and reduces flicker by taking advantage of the persistence of vision effect. This results in an effective doubling of time resolution (also called temporal resolution) compared with non-interlaced footage (for frame rates equal to field rates). Interlaced signals require a display that is natively capable of showing the individual fields in sequential order. Only ALiS plasma panels and traditional cathode ray tube (CRT) TV sets can display interlaced signals directly, owing to their electronic scanning and lack of an apparent fixed resolution.

Interlaced scan refers to one of two common methods for "painting" a video image on an electronic display screen (the other being progressive video) by scanning or displaying each line or row of pixels. This technique uses two fields to create a frame. One field contains all the odd lines in the image, the other contains all the even lines of the image. A PAL-based television set display, for example, scans 50 fields every second (25 odd and 25 even). The two sets of 25 fields work together to create a full frame every 1/25 of a second, resulting in a display of 25 frames per second, but with a new half frame every 1/50 of a second.
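
As a rough illustration of the field/frame relationship described above, the following Python sketch (with illustrative function names, not taken from any video library) splits a frame, modelled as a list of scanlines, into its two fields and weaves them back together; at PAL rates, 50 such fields per second yield 25 woven frames per second.

    # Minimal sketch: splitting a frame into two fields and weaving them back.
    # A frame is modelled as a list of scanlines (strings); names are illustrative.

    def split_into_fields(frame):
        """Return (top_field, bottom_field): even-indexed and odd-indexed lines."""
        top_field = frame[0::2]      # lines 0, 2, 4, ...
        bottom_field = frame[1::2]   # lines 1, 3, 5, ...
        return top_field, bottom_field

    def weave_fields(top_field, bottom_field):
        """Interleave two fields back into a full frame."""
        frame = []
        for top_line, bottom_line in zip(top_field, bottom_field):
            frame.append(top_line)
            frame.append(bottom_line)
        return frame

    frame = [f"line {i}" for i in range(576)]    # e.g. a 576-line PAL frame
    top, bottom = split_into_fields(frame)
    assert weave_fields(top, bottom) == frame
    field_rate = 50                              # PAL: 50 fields per second
    print("frame rate:", field_rate / 2)         # 25.0 woven frames per second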

To display interlaced video on progressive scan displays, deinterlacing is applied to the video signal.

Despite arguments against it,[1][2] interlacing continues to be supported by the television standards organizations. It is still included in digital video transmission formats such as DV, DVB, and ATSC. Newer video compression standards, such as High Efficiency Video Coding, target high-definition progressive video and provide no dedicated coding tools for interlaced formats.

Description

With progressive scan, an image is captured, transmitted, and displayed in a path similar to text on a page: line by line, from top to bottom. The interlaced scan pattern in a CRT (cathode ray tube) display also completes such a scan, but only for every second line, working from the top left corner to the bottom right corner of the display. The process is then repeated, this time starting at the second row, to fill in the gaps left by the first pass over the alternate rows.

This scanning of every second line is called interlacing. A field is an image that contains only half of the lines needed to make a complete picture. The afterglow of the CRT's phosphor, combined with the persistence of vision, results in the two fields being perceived as a continuous image. This allows the viewing of full horizontal detail with the same bandwidth that would be required for a full progressive scan, but with twice the perceived frame rate and with a CRT refresh rate high enough to prevent flicker. Interlacing is used by all analog broadcast television systems in current use.

In common shorthand format identifiers such as 576i50 and 720p50, the number after the scan type gives the frame rate for progressive scan formats, but for interlaced formats it typically gives the field rate, which is twice the frame rate. This can lead to confusion, because industry-standard SMPTE timecode formats always deal with frame rate, not field rate. To avoid ambiguity, SMPTE and EBU always use frame rate when specifying interlaced formats, e.g. 480i60, 576i50, 1080i50, and 1080i60 become 480i/30, 576i/25, 1080i/25, and 1080i/30, with the convention that each interlaced frame always contains two fields in sequence.
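
The conversion between the two notations is simple arithmetic; the sketch below (a hypothetical helper, not part of any standard tooling) turns field-rate shorthand such as 1080i50 into the SMPTE/EBU frame-rate form 1080i/25 by halving the trailing number for interlaced formats.

    # Illustrative sketch: converting shorthand such as "1080i50" (field rate)
    # into the SMPTE/EBU frame-rate form "1080i/25". Names are hypothetical.

    def to_frame_rate_notation(identifier: str) -> str:
        lines, scan, rate = "", "", ""
        for ch in identifier:
            if ch.isdigit() and not scan:
                lines += ch
            elif ch in "ip" and not scan:
                scan = ch
            else:
                rate += ch
        rate = int(rate)
        # For interlaced formats the shorthand gives the field rate,
        # which is twice the frame rate; progressive rates pass through.
        frame_rate = rate // 2 if scan == "i" else rate
        return f"{lines}{scan}/{frame_rate}"

    for fmt in ("480i60", "576i50", "1080i50", "720p50"):
        print(fmt, "->", to_frame_rate_notation(fmt))
    # 480i60 -> 480i/30, 576i50 -> 576i/25, 1080i50 -> 1080i/25, 720p50 -> 720p/50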

Benefits of interlacing

One of the most important factors in analog television is signal bandwidth, measured in megahertz. The greater the bandwidth, the more expensive and complex is the entire production and broadcasting chain (cameras, storage systems such as tape recorders or hard disks, broadcast and reception systems such as terrestrial, cable, and satellite transmitters and receivers, or the Internet, and end-user displays such as television sets or computer display monitors).

For a given line count and refresh rate, analog interlaced video reduces the signal bandwidth by a factor of two.

Given a fixed bandwidth instead, interlace can provide a video signal with twice the display refresh rate for a given line count (versus progressive scan video at a similar frame rate; for instance, 1080i at 60 half-frames per second versus 1080p at 30 full frames per second). The higher refresh rate improves the portrayal of motion, because objects in motion are captured and their position updated on the display more often; when objects are more stationary, human vision combines information from multiple similar half-frames, resulting in the same perceived resolution as progressive full frames. This technique is only useful, though, if the source material is available at the higher refresh rate: cinema films are typically recorded at 24 frames per second and gain no real benefit from common interlacing techniques.

Given both a fixed bandwidth and high refresh rate, interlaced video can also be seen as providing a higher spatial resolution than progressive scan. For instance, 1920×1080 pixel resolution interlaced HDTV with a 60 Hz field rate (known as 1080i60 or 1080i/30) has a similar bandwidth to 1280×720 pixel progressive scan HDTV with a 60 Hz frame rate (720p60 or 720p/60), but achieves approximately twice the spatial resolution for low-motion scenes.
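
The raw-throughput arithmetic behind these comparisons can be sketched as follows (uncompressed pixel counts only, ignoring blanking intervals and chroma subsampling; the helper name is illustrative).

    # Back-of-the-envelope pixel throughput (uncompressed, ignoring blanking).
    # An interlaced format transmits half the lines per field, so per second it
    # carries width * (height/2) * field_rate pixels, the same as a progressive
    # format at half the rate.

    def pixels_per_second(width, height, rate, interlaced):
        lines_per_pass = height // 2 if interlaced else height
        return width * lines_per_pass * rate

    # 1080i with a 60 Hz field rate vs 720p with a 60 Hz frame rate
    i1080 = pixels_per_second(1920, 1080, 60, interlaced=True)    # ~62.2 Mpx/s
    p720  = pixels_per_second(1280, 720, 60, interlaced=False)    # ~55.3 Mpx/s

    # 1080i50 vs 1080p25: identical raw throughput, but 1080i refreshes
    # the display 50 times per second instead of 25.
    i1080_50 = pixels_per_second(1920, 1080, 50, interlaced=True)
    p1080_25 = pixels_per_second(1920, 1080, 25, interlaced=False)

    print(f"1080i60: {i1080/1e6:.1f} Mpx/s, 720p60: {p720/1e6:.1f} Mpx/s")
    print(f"1080i50: {i1080_50/1e6:.1f} Mpx/s, 1080p25: {p1080_25/1e6:.1f} Mpx/s")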

However, the bandwidth benefits apply only to an analog or uncompressed digital video signal. With digital video compression, as used in all current digital TV standards, interlacing introduces additional inefficiencies,[3] so the transmission bandwidth savings of interlaced video over fully progressive video are minimal even with twice the frame rate; i.e., a 1080p50 signal produces roughly the same bit rate as a 1080i50 signal.[4] According to one MPEG-4 AVC test using a "sports-type" scene, 1080p actually requires less bandwidth to be perceived as subjectively better than its 1080i equivalent.[5]

Problems caused by interlacing

Interlaced video is designed to be captured, transmitted, or stored, and displayed in the same interlaced format. Because each frame of interlaced video is composed of two fields that are captured at different moments in time, interlaced video frames will exhibit motion artifacts known as "interlacing effects", or "combing", if the recorded objects are moving fast enough to be in different positions when each individual field is captured. These artifacts may be more visible when interlaced video is displayed at a slower speed than it was captured or when still frames are presented.
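
The combing effect can be illustrated with a toy simulation: two fields are "captured" with a small object at different horizontal positions and then woven into a single frame, producing the characteristic comb pattern on alternating lines. This is only a sketch; real video uses pixel arrays rather than text, and the names below are illustrative.

    # Illustrative sketch of the "combing" artifact: a small object moves between
    # the moments the two fields are captured, so weaving them into one frame
    # puts the object at two horizontally offset positions on alternating lines.

    WIDTH, HEIGHT = 24, 8

    def capture(x_position):
        """Render one full frame with a 4-pixel-wide object at x_position."""
        return ["".join("#" if x_position <= x < x_position + 4 else "."
                        for x in range(WIDTH)) for _ in range(HEIGHT)]

    # Field 1 (even lines) is captured with the object at x=4,
    # field 2 (odd lines) is captured 1/50 s later with the object at x=10.
    field_even = capture(4)[0::2]
    field_odd = capture(10)[1::2]

    # Weave the two fields into a single frame and show the comb pattern.
    woven = [None] * HEIGHT
    woven[0::2] = field_even
    woven[1::2] = field_odd
    print("\n".join(woven))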

Interline twitter

Interlace introduces a potential problem called interline twitter. This aliasing effect shows up only under certain circumstances: when the subject being shot contains vertical detail that approaches the vertical resolution limit of the video format. For instance, a person on television wearing a shirt with fine dark and light stripes may appear on a video monitor as if the stripes on the shirt are "twittering". Television professionals are taught to avoid wearing clothing with fine striped patterns for this reason. Professional video cameras and computer-generated imagery systems apply a low-pass filter to the vertical detail of the signal in order to prevent interline twitter.

Interline twitter is the primary reason that interlacing is unacceptable for a computer display. Each scanline on a high-resolution computer monitor typically displays discrete pixels that do not span the scanlines above or below. When the overall interlaced frame rate is 30 frames per second (60 fields per second), a pixel that spans only one scanline is visible for 1/60 of a second followed by 1/60 of a second of darkness, so that detail is refreshed at only 30 Hz, half the field rate, producing noticeable flicker.
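
A quick back-of-the-envelope check of that refresh-rate argument, assuming a 60 Hz field rate:

    # Assumed 60 Hz field rate; values are illustrative.
    field_rate = 60.0                    # fields drawn per second
    frame_rate = field_rate / 2          # 30 full frames per second
    # Detail present in both fields is repainted every field (60 Hz);
    # detail confined to a single scanline appears in only one field per frame.
    print("two-line detail refreshed at", field_rate, "Hz")     # 60.0 Hz
    print("single-line detail refreshed at", frame_rate, "Hz")  # 30.0 Hz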

To avoid this problem, sharp detail is typically never displayed on a standard interlaced television set. When computer graphics are shown on a standard television set, the screen is treated as if it had half its actual resolution, or even lower. If text is displayed, it is made large enough that horizontal strokes are never just one scanline tall. Most fonts used in television programming have wide, heavy strokes and omit the fine-detail serifs that would make twittering more visible.

This animation demonstrates the interline twitter effect using the Indian Head test card. On the left are two progressive scan images; in the center, two interlaced images; on the right, two images with line doublers. The top row is at the original resolution; the bottom row is anti-aliased. The interlaced images use half the bandwidth of the progressive ones. The interlaced scan (center) precisely duplicates the pixels of the progressive image (left), but interlace causes details to twitter. A line doubler operating in "bob" (interpolation) mode would produce the images at far right. Real interlaced video blurs such details to prevent twitter, as seen in the bottom row, but such softening (or anti-aliasing) comes at the cost of resolution. Even the best line doubler could never restore the bottom center image to the full resolution of the progressive image. Note: because the frame rate has been slowed by a factor of 3, additional flicker is visible in the simulated interlaced portions of this image.

Deinterlacing

Although CRTs and ALiS plasma panels can display interlaced video directly, modern computer video displays and TV sets are mostly based on LCD technology, which utilizes progressive scanning.

Displaying interlaced video on a progressive scan display requires a process called deinterlacing. Deinterlacing is not perfect: it generally results in lower resolution and various artifacts, particularly in areas with objects in motion. Providing the best possible picture quality for interlaced video signals requires expensive and complex devices and algorithms.
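
Two of the simplest deinterlacing strategies can be sketched in a few lines: "weave", which interleaves both fields into one frame (sharp, but combs on motion), and "bob", which builds a full-height frame from each field by naive line doubling (smooth motion, but half the vertical detail). The function names below are illustrative, not taken from any particular player or library.

    def weave(top_field, bottom_field):
        """Interleave the two fields into one full frame (sharp, may comb)."""
        frame = [None] * (len(top_field) + len(bottom_field))
        frame[0::2] = top_field
        frame[1::2] = bottom_field
        return frame

    def bob(field):
        """Double each field line to reconstruct a full-height frame."""
        frame = []
        for line in field:
            frame.append(line)
            frame.append(line)   # naive interpolation: repeat the line
        return frame

    top = [f"even line {i}" for i in range(0, 6, 2)]
    bottom = [f"odd line {i}" for i in range(1, 6, 2)]
    print(weave(top, bottom))   # one sharp frame built from two fields
    print(bob(top))             # one frame per field, half the vertical detail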

For television displays, deinterlacing systems are integrated into progressive scan TV sets that accept interlaced signals, such as broadcast SDTV.

Most modern computer monitors do not support interlaced video, apart from some legacy text-only display modes. Playing back interlaced video on a computer display therefore requires some form of deinterlacing in the software player, which often uses very simple deinterlacing methods; as a result, interlaced video commonly shows visible artifacts on computer systems. Computer systems are frequently used to edit video, and the disparity between computer displays and television signal formats means that interlaced content cannot be viewed properly unless separate video display hardware is used.

Currently manufactured TV sets employ systems that intelligently extrapolate the extra information that would be present in a progressive signal entirely from an interlaced original. In theory, this should simply be a matter of applying the appropriate algorithms to the interlaced signal, since all the information needed should be present in that signal. In practice, results are at present somewhat variable and depend on the quality of the input signal and the amount of processing power applied to the conversion. The biggest impediment, at present, is artifacts in lower quality interlaced signals (generally broadcast video), as these are not consistent from field to field. On the other hand, high bit rate interlaced signals, such as those from HD camcorders operating in their highest bit rate mode, work surprisingly well.
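
One common approach behind such processing, though not necessarily what any particular TV set implements, is motion-adaptive deinterlacing: where successive fields of the same parity barely differ, the missing lines are woven in from the opposite field; where they differ strongly, the missing lines are estimated from neighbouring lines instead. Below is a simplified sketch with illustrative names and a copy-the-neighbouring-line stand-in for real interpolation.

    # Hedged sketch of a motion-adaptive deinterlacer. Each field is a list of
    # rows of integer pixel values; current_field and previous_field have the
    # same parity, other_field has the opposite parity.

    def deinterlace_motion_adaptive(current_field, other_field, previous_field,
                                    threshold=16):
        frame = []
        for row_current, row_other, row_previous in zip(
                current_field, other_field, previous_field):
            frame.append(row_current)              # original line: keep as-is
            missing = []
            for x, pixel in enumerate(row_other):
                motion = abs(row_current[x] - row_previous[x])
                if motion < threshold:
                    missing.append(pixel)           # static area: weave
                else:
                    missing.append(row_current[x])  # moving area: copy neighbour line
            frame.append(missing)
        return frame

    # Toy 4x4 example: the right half of the picture is moving.
    prev_even = [[10, 10, 200, 200]] * 2
    curr_even = [[10, 10, 50, 50]] * 2
    curr_odd  = [[10, 10, 120, 120]] * 2
    for row in deinterlace_motion_adaptive(curr_even, curr_odd, prev_even):
        print(row)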

History

When motion picture film was developed, it was observed that the movie screen had to be illuminated at a high rate to prevent visible flicker. The exact rate necessary varies with brightness: 40 Hz is acceptable in dimly lit rooms, while up to 80 Hz may be necessary for bright displays that extend into peripheral vision. The film solution was to project each frame three times using a three-bladed shutter: a movie shot at 16 frames per second thus illuminated the screen 48 times per second. Later, when sound film became available, the higher projection speed of 24 frames per second allowed a two-bladed shutter to maintain the 48-per-second illumination, but only in projectors incapable of projecting at the lower speed.

But this solution could not be used for television: storing a full video frame and scanning it twice would require a frame buffer, which did not become feasible until the late 1980s. In addition, avoiding on-screen interference patterns caused by studio lighting and the limits of vacuum tube technology required that CRTs for TV be scanned at the AC line frequency (60 Hz in the US, 50 Hz in Europe).

In the domain of mechanical television, the concept of interlacing was demonstrated by Léon Theremin. He had been developing a mirror-drum-based television, starting with 16 lines of resolution in 1925, then 32 lines, and eventually 64 lines using interlacing in 1926. As part of his thesis, on May 7, 1926, he electrically transmitted and then projected near-simultaneous moving images on a five-foot-square screen.[6]

The concept of breaking a single video frame into interlaced lines was first formulated and patented by German Telefunken engineer Fritz Schröter in 1930,[7] and in the USA by RCA engineer Randall C. Ballard in 1932.[8][9] Commercial implementation began in 1934 as cathode ray tube screens became brighter, increasing the level of flicker caused by progressive (sequential) scanning.[10]

In 1936, when the analog standards were being set in the UK, CRTs could only scan around 200 lines in 1/50 of a second. By using interlace, a pair of 202.5-line fields could be interleaved to form a sharper 405-line frame. The vertical scan frequency remained 50 Hz, so flicker was not a problem, and visible detail was noticeably improved. As a result, this system was able to supplant John Logie Baird's 240-line mechanical progressive scan system, which was also in use at the time.

From the 1940s onward, improvements in technology allowed the US and Europe to adopt systems using progressively more bandwidth to scan higher line counts and achieve better pictures. However, the fundamentals of interlaced scanning were at the heart of all of these systems. The US adopted the 525-line system known as NTSC; Europe adopted a 625-line system; and the UK switched from its 405-line system to 625 lines in order to avoid having to develop a unique method of color TV. France switched from its unique 819-line system to the common European standard of 625 lines. Although the term PAL is often used to describe the line and frame standard of the TV system, this is incorrect: it refers only to the method of superimposing the colour information on the standard 625-line broadcast. France adopted its own SECAM system, which was also adopted by some other countries, notably Russia and its satellites. PAL colour encoding has been used on some otherwise NTSC-standard broadcasts, notably in Brazil.

Interlacing was ubiquitous in displays until the 1970s, when the needs of computer monitors resulted in the reintroduction of progressive scan. Interlace is still used for most standard definition TVs, and the 1080i HDTV broadcast standard, but not for LCD, micromirror (DLP), or plasma displays; these displays do not use a raster scan to create an image, and so cannot benefit from interlacing: in practice, they have to be driven with a progressive scan signal. The deinterlacing circuitry to get progressive scan from a normal interlaced broadcast television signal can add to the cost of a television set using such displays. Currently, progressive displays dominate the HDTV market.

Interlace and computers

In the 1970s, computers and home video game systems began using TV sets as display devices. At that point, a 480-line NTSC signal was well beyond the graphics abilities of low-cost computers, so these systems used a simplified video signal in which each video field scanned directly on top of the previous one, rather than offset between the lines of the previous field; this marked the return of progressive scanning, not seen since the 1920s. Since each field became a complete frame on its own, modern terminology would call this 240p on NTSC sets and 288p on PAL. While consumer devices were permitted to create such signals, broadcast regulations prohibited TV stations from transmitting video of this kind. Computer monitor standards such as CGA were further simplifications of NTSC, improving picture quality by omitting color modulation and allowing a more direct connection between the computer's graphics system and the CRT.

By the mid-1980s, computers had outgrown these video systems and needed better displays. The Apple IIgs suffered from its use of the old scanning method: its highest display resolution was 640×200, resulting in severely distorted, tall and narrow pixels that made the display of realistically proportioned images difficult. Solutions from various companies varied widely. Because PC monitor signals did not need to be broadcast, they could consume far more than the 6, 7, and 8 MHz of bandwidth to which NTSC and PAL signals were confined. IBM's Monochrome Display Adapter and Enhanced Graphics Adapter, as well as the Hercules Graphics Card and the original Macintosh computer, generated video signals close to 350p. The Commodore Amiga, by contrast, generated a true interlaced NTSC signal (as well as RGB variations). This ability made the Amiga dominant in the video production field until the mid-1990s, but the interlaced display mode caused flicker problems for more traditional PC applications where single-pixel detail is required. 1987 saw the introduction of VGA, on which PCs soon standardized; Apple followed suit some years later, when the VGA standard was improved to match Apple's proprietary 24-bit color video standard, also introduced in 1987.

In the late 1980s and early 1990s, monitor and graphics card manufacturers introduced newer high resolution standards that once again included interlace. These monitors ran at very high refresh rates, in the hope that this would alleviate flicker problems. Such monitors proved very unpopular: while flicker was not obvious on them at first, eyestrain and lack of focus became a serious problem. The industry quickly abandoned the practice, and for the rest of the decade monitors carried the assurance that their stated resolutions were "non-interlaced". This experience is why the PC industry remains opposed to interlace in HDTV and lobbied for the 720p standard; it continues to lobby for formats beyond 720p, namely 1080p at 60 Hz for NTSC-legacy countries and 1080p at 50 Hz for PAL-legacy countries.

References

  1. ^ Philip Laven (January 25, 2005). "EBU Technical Review No. 300 (October 2004)". EBU. http://www.ebu.ch/en/technical/trev/trev_300-ibc-2004.html. 
  2. ^ Philip Laven (January 26, 2005). "EBU Technical Review No. 301 (January 2005)". EBU. http://www.ebu.ch/en/technical/trev/trev_301-editorial.html. 
  3. ^ http://www.atd.net/HDTV_faq.html
  4. ^ "10 things you need to know about... 1080p/50" (PDF). EBU. September 2009. http://tech.ebu.ch/docs/testmaterial/ibc09_10things_1080p50.pdf. Retrieved 2010-06-26. 
  5. ^ Hoffman, Itagaki, Wood, Bock (2006-12-04). "Studies on the Bit Rate Requirements for a HDTV Format With 1920x1080 pixel Resolution, Progressive Scanning at 50 Hz Frame Rate Targeting Large Flat Panel Displays" (PDF). IEEE Transactions on Broadcasting, Vol. 52, No. 4. http://bura.brunel.ac.uk/bitstream/2438/1181/1/Fullext.pdf. Retrieved 2011-09-08. "It has been shown that the coding efficiency of 1080p/50 is very similar (simulations) or even better (subjective tests) than 1080i/25 despite the fact that twice the number of pixels have to be coded. This is due to the higher compression efficiency and better motion tracking of progressively scanned video signals compared to interlaced scanning." 
  6. ^ Glinsky, Albert (2000). Theremin: Ether Music and Espionage. Urbana, Illinois: University of Illinois Press. ISBN 0-252-02582-2.  pages 41-45
  7. ^ Registered by the German Reich patent office, patent no. 574085.
  8. ^ "Pioneering in Electronics". David Sarnoff Collection. Archived from the original on 2006-08-21. http://web.archive.org/web/20060821005430/http://www.davidsarnoff.org/kil-chapter09.htm. Retrieved 2006-07-27. 
  9. ^ U.S. patent 2,152,234. Interestingly, reducing flicker is listed only fourth in a list of objectives of the invention.
  10. ^ R.W. Burns, Television: An International History of the Formative Years, IET, 1998, p. 425. ISBN 9780852969144.
